# Small-scale parameters
## News Summarizer T5
License: Apache-2.0
A news summarization model fine-tuned from google-t5/t5-small, producing concise summaries of news articles.
Tags: Text Generation, Transformers
Author: SurAyush · Downloads: 62 · Likes: 0
## Gpt2 Demo
License: Other
GPT-2 is a self-supervised, Transformer-based language model pre-trained for text generation tasks.
Tags: Large Language Model, Transformers
Author: demo-leaderboard · Downloads: 19.21k · Likes: 1
## Bloom 350m German
License: MIT
A BLOOM-350m language model trained from scratch on German data; a small-scale member of the BLOOM series focused on German text generation.
Tags: Large Language Model, Transformers, German
Author: malteos · Downloads: 26 · Likes: 0
## Danbert Small Cased
License: Apache-2.0
DanBERT is a Danish pre-trained model based on the BERT-Base architecture, trained on over 2 million Danish sentences.
Tags: Large Language Model, Supports Multiple Languages
Author: alexanderfalk · Downloads: 18 · Likes: 1
## Gpt Neo 125m
License: MIT
GPT-Neo 125M is a 125-million-parameter Transformer model based on the GPT-3 architecture, developed by EleutherAI and used primarily for English text generation.
Tags: Large Language Model, English
Author: EleutherAI · Downloads: 150.96k · Likes: 204